aws-samples / serverless-pdf-chat

LLM-powered document chat using Amazon Bedrock and AWS Serverless
https://aws.amazon.com/blogs/compute/building-a-serverless-document-chat-with-aws-lambda-and-amazon-bedrock/
MIT No Attribution

Breaking change in LangChain/FAISS #45

Closed by datasith 1 month ago

datasith commented 5 months ago

For folks running into this error:

 ValueError: The de-serialization relies loading a pickle file. Pickle files can
 be modified to deliver a malicious payload that results in execution of
 arbitrary code on your machine.You will need to set
 `allow_dangerous_deserialization` to `True` to enable deserialization. If you do
 this, make sure that you trust the source of the data. For example, if you are
 loading a file that you created, and no that no one else has modified the file,
 then this is safe to do. Do not set this to `True` if you are loading a file
 from an untrusted source (e.g., some random site on the internet.

This is due to a recent, intentional breaking change in LangChain.

Update this line in the generate response Lambda:

faiss_index = FAISS.load_local("/tmp", embeddings)

to:

faiss_index = FAISS.load_local("/tmp", embeddings, allow_dangerous_deserialization=True)

and re-deploy using your preferred method.

Also, please take a moment to consider why this change was made, and the security risks associated with loading pickle files.
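To see why unpickling untrusted data is risky, here's a minimal stdlib-only sketch (no LangChain or FAISS required): pickle's `__reduce__` hook lets a crafted payload run an arbitrary callable during `pickle.loads`. A benign `os.getpid` call stands in for what could be any malicious code.

```python
import os
import pickle

class Payload:
    """A crafted object whose deserialization executes code."""

    def __reduce__(self):
        # pickle serializes a (callable, args) pair; pickle.loads will
        # call os.getpid() during deserialization -- before any object
        # is handed back to the caller. A real attack would run
        # something far worse here.
        return (os.getpid, ())

blob = pickle.dumps(Payload())
result = pickle.loads(blob)  # os.getpid() executes right here
print(result)                # the current process ID, proving the call ran
```

This is why `allow_dangerous_deserialization=True` is safe only for index files you created yourself and that nobody else could have modified, such as the `/tmp` index this Lambda writes and reads within the same invocation.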

moserda commented 1 month ago

Hi, since this has been resolved in c4a8d2b I'll close this issue.