PromtEngineer / localGPT

Chat with your documents on your local device using GPT models. No data leaves your device, and it is 100% private.
Apache License 2.0

data privacy #365

Open · bp020108 opened 1 year ago

bp020108 commented 1 year ago

How can we block data fetches from the internet? If nothing is found in the local "source documents" directory, the model should not answer at all, or it should return an error such as "no data found". This is to ensure there is no leak of the local directory data to the outside. Can you please help?

PromtEngineer commented 1 year ago

It's not actually looking for data on the internet, even when it can't find an answer in your local documents. All answers are generated from the model weights stored locally on your machine (after the model is downloaded). You can modify the prompt template to add the "no data found" behavior. But to reiterate: the code/model is not sending or receiving any data over the internet.
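(For illustration, a minimal sketch of such a prompt, assuming the LangChain `PromptTemplate` class that localGPT builds on; the exact template text and variable names in the repo's prompt utilities may differ.)

```python
# Sketch only: a retrieval-QA style prompt that refuses to answer when the
# retrieved context does not contain the answer. Assumes LangChain's
# PromptTemplate; adapt it to the template actually used in the repo.
from langchain.prompts import PromptTemplate

template = (
    "Use only the following pieces of context to answer the question. "
    "If the answer is not contained in the context, reply exactly with "
    "'no data found' and nothing else.\n\n"
    "Context: {context}\n\n"
    "Question: {question}\n"
    "Answer:"
)

qa_prompt = PromptTemplate(
    template=template,
    input_variables=["context", "question"],
)
```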

bp020108 commented 1 year ago

Then how am I seeing responses to questions that are not in the source document?

I have uploaded only one document to the source documents directory, but I am still seeing responses to other questions, so is it going to the internet to get the answers? The model cannot have all the answers locally, right?

Please help me here. Does this project have zero chance of leaking the data that is in the local source directory?

I am hosting this project on a local server.

PromtEngineer commented 1 year ago

The model has limited knowledge and can derive answers from what it learned during training. It is not accessing the internet.

What I would recommend is to disconnect the internet and then chat with the model. That is probably the best test.
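(As an illustration, one way to make this test strict is to block outbound sockets inside the Python process before loading the model. This is only a sketch, not something the repo provides; it assumes CPython's pure-Python `socket.socket` wrapper can be patched this way.)

```python
# Sketch only: disable outbound connections in this process so that any
# attempted network call raises immediately. If the chat loop still answers
# questions after this, the answers are coming from local weights.
import socket

def _blocked(*args, **kwargs):
    raise RuntimeError("Network access attempted while running offline test")

# Patch the connect method of the Python-level socket class.
socket.socket.connect = _blocked

# ...now load the model and run the usual chat loop in this same process.
```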

bp020108 commented 1 year ago

Thanks for your reply and clarification.

I had already blocked the internet and was still getting responses, so I thought I had not managed to block the internet completely with the firewall, but I now think it is using its local knowledge.

I wasn't sure how much information or knowledge such a small local model could hold, so I was confused.

Do you have any documents or links that explain how this model (which does not take much storage) can answer random questions?

I am also looking forward to your new video on the local API (so it can be opened in a web browser via an IP address) and on connecting an open-source GUI (Gradio or similar, for a better user experience), so that we can deploy it internally for our own data and access the information easily.

bp020108 commented 1 year ago

Is there a plan to add a Gradio GUI for the API?
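(For reference, a minimal Gradio wrapper around a local question-answering function could look like the sketch below. `answer_question` is a hypothetical placeholder, not an actual localGPT function, and the host/port settings are just examples.)

```python
# Sketch only: a minimal Gradio UI wrapping a local QA function.
import gradio as gr

def answer_question(question: str) -> str:
    # Replace this stub with a call into the local retrieval/QA pipeline.
    return f"(local answer for: {question})"

demo = gr.Interface(
    fn=answer_question,
    inputs=gr.Textbox(label="Question"),
    outputs=gr.Textbox(label="Answer"),
    title="localGPT (local only)",
)

if __name__ == "__main__":
    # server_name="0.0.0.0" makes the UI reachable from other machines on the LAN.
    demo.launch(server_name="0.0.0.0", server_port=7860)
```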