run-llama / llama-hub

A library of data loaders for LLMs made by the community -- to be used with LlamaIndex and/or LangChain
https://llamahub.ai/
MIT License

[Bug]: I can't have llama-index and s3fs at the same time #835

Open fedsp opened 8 months ago

fedsp commented 8 months ago

Bug Description

I am trying to use llama-index with the S3 reader in an AWS Lambda function (container image on ECR). I need to build a Dockerfile that installs llama-index and s3fs at the same time (keep in mind that the AWS Lambda pre-built image already has boto3 installed).

Version

0.9.24

Steps to Reproduce

This is my Dockerfile:

```dockerfile
FROM public.ecr.aws/lambda/python:3.11

# Install dependencies
RUN pip install llama-index
RUN pip install s3fs

COPY app.py ${LAMBDA_TASK_ROOT}

CMD [ "app.lambda_handler" ]
```

This is my code:

```python
from llama_index import VectorStoreIndex, download_loader

S3Reader = download_loader("S3Reader", custom_path="/tmp")
```
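For context, a minimal app.py consistent with the Dockerfile's `CMD [ "app.lambda_handler" ]` might look like the sketch below; only the two llama-index lines come from this report, the handler wrapper and event fields are assumptions.

```python
# Hypothetical app.py matching CMD [ "app.lambda_handler" ]; only the two
# llama-index lines are taken from this report, the rest is an assumed wrapper.
from llama_index import VectorStoreIndex, download_loader


def lambda_handler(event, context):
    # download_loader installs the loader's dependencies with pip at runtime;
    # /tmp is the only writable path inside a Lambda container image.
    S3Reader = download_loader("S3Reader", custom_path="/tmp")

    loader = S3Reader(bucket=event["bucket"])  # assumed event field
    documents = loader.load_data()

    index = VectorStoreIndex.from_documents(documents)
    return {"statusCode": 200, "documents_indexed": len(documents)}
```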

Expected result: download_loader completes and the code runs smoothly with the latest versions of llama-index and s3fs alongside the pre-installed boto3.

Actual result: while executing the download_loader line, llama-index runs several pip installs, which fail with this error:

```
ContextualVersionConflict: (botocore 1.33.13 (/var/lang/lib/python3.11/site-packages), Requirement.parse('botocore<1.32.0,>=1.31.83'), {'boto3'})
```
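The versions involved can be confirmed from inside the image with a short check like the one below (a diagnostic sketch, not part of the original report): the installed botocore 1.33.13 falls outside the botocore<1.32.0,>=1.31.83 range pinned by the boto3 that ships with the Lambda base image.

```python
# Diagnostic sketch (not from the original report): print the installed
# boto3/botocore versions and boto3's botocore pin to confirm the conflict.
from importlib.metadata import requires, version

print("boto3    :", version("boto3"))
print("botocore :", version("botocore"))  # 1.33.13 in the failing run
print("boto3 pin:", [r for r in requires("boto3") if r.startswith("botocore")])
```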

Relevant Logs/Tracebacks

No response