laozhai507 opened 6 months ago
Hey @laozhai507! 👋 I'm Dosu, a friendly bot here to help you squash bugs, answer your questions, and guide you on contributing while we wait for a human maintainer to chime in. I've just caught sight of your issue with the `HuggingFaceEmbedding` call. Hang tight; I'm diving into the details and will get back to you with some insights shortly! 🤖🔍
I think this is more of an issue with your env / OS? This works fine on Google Colab and my Mac, for example.
That's weird... My OS is macOS 14.4.1 with an M2 Max, and my Python env is 3.11.9. Could you please advise me on how to solve this issue? Thanks a lot! @logan-markewich
To address the segmentation fault you're encountering with `HuggingFaceEmbedding(model_name='BAAI/bge-large-en-v1.5')`, let's focus on a few targeted steps:
Check Transformers Library Version: Ensure your Transformers library is at least version 4.37.0 but less than 5.0.0, as this range is compatible with LlamaIndex v0.10.36. Update if necessary.
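To confirm the installed version actually falls in that range, a small stdlib-only check can help. The bounds below come from the comment above; the helper names are illustrative, and the simple parser assumes plain `X.Y.Z` version strings (no release-candidate suffixes):

```python
def _parse(ver: str) -> tuple:
    """Turn a version string like '4.37.0' into a comparable tuple (4, 37, 0)."""
    return tuple(int(part) for part in ver.split(".")[:3])

def transformers_version_ok(ver: str, lo: str = "4.37.0", hi: str = "5.0.0") -> bool:
    """True if ver is in [lo, hi) -- the range cited above for LlamaIndex v0.10.36."""
    return _parse(lo) <= _parse(ver) < _parse(hi)

# To check the version actually installed in your environment:
# import transformers
# print(transformers_version_ok(transformers.__version__))
```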
Python Version Compatibility: You're using Python 3.11.9, which might not be fully supported by all your dependencies. Consider downgrading to Python 3.8 or 3.9, as these versions are more commonly supported and could potentially resolve the issue.
Environment Testing: Given your MacOS 14.4.1 with M2 Max setup, try running your script in a different environment, such as a Linux-based system, to rule out any specific incompatibilities with your current setup.
Enable Debugging: For more insight into the crash, enable the `DEBUG` logging level in your application and use `LlamaDebugHandler` for detailed error messages and tracebacks. This can provide more clues as to what might be causing the segmentation fault.
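The debugging step above might be set up as follows. The `llama_index.core` import paths are my assumption for the v0.10.x package layout and may differ in other releases; the stdlib logging setup applies either way:

```python
import logging
import sys

# Verbose logging often shows the last Python-level operation before a native crash.
logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)
logging.getLogger().setLevel(logging.DEBUG)

try:
    # Assumed import paths for llama-index v0.10.x; adjust for your install.
    from llama_index.core import Settings
    from llama_index.core.callbacks import CallbackManager, LlamaDebugHandler

    # Route LlamaIndex events through the debug handler so a trace is
    # printed at the end of each operation.
    Settings.callback_manager = CallbackManager([LlamaDebugHandler(print_trace_on_end=True)])
except ImportError:
    pass  # llama-index not installed here; the logging setup above still applies
```

Note that a segmentation fault happens in native code, so Python-level logging may stop abruptly; the value is in seeing which operation was last attempted before the crash.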
These steps should help in diagnosing and potentially resolving the segmentation fault issue. If the problem persists, providing more detailed information about your setup, including the exact versions of your dependencies, could help in further troubleshooting.
To continue the conversation, mention @dosu.
Python 3.9 has the same error; will try Linux later. @dosu
If downgrading to Python 3.9 didn't resolve the issue, it's good that you're considering testing on Linux next. In the meantime, ensure that your Transformers library is indeed within the specified version range compatible with your current setup. Additionally, if you haven't already, try updating all related dependencies to their latest versions that are still within the compatibility range for LlamaIndex v0.10.36. This includes libraries that `HuggingFaceEmbedding` might depend on, directly or indirectly.
After updating dependencies, if the issue persists on MacOS, testing on Linux will be a crucial step. It will help determine if the problem is specific to the MacOS environment or if it's more broadly related to the code or library versions. When you do test on Linux, ensure that the environment is as similar as possible to your MacOS setup in terms of Python version and installed libraries to accurately pinpoint the cause.
Lastly, if the segmentation fault still occurs on Linux, consider enabling core dumps to capture and analyze the crash details. This can provide valuable insights into what's happening at the moment of the crash. To enable core dumps, you can use the command `ulimit -c unlimited` before running your script in the terminal. After a crash, you'll find a core dump file in the directory where you ran the script, which can be analyzed with debugging tools like `gdb` on Linux.
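The core-dump workflow above might look like this on Linux. The script name and core filename are placeholders, and the actual core file location depends on the system's `/proc/sys/kernel/core_pattern` setting:

```shell
# Raise the core-dump size limit for this shell session
ulimit -c unlimited
ulimit -c   # should now report "unlimited"

# Reproduce the crash (placeholder script name):
# python crash_repro.py

# Inspect the backtrace from the resulting core file:
# gdb --batch -ex bt "$(command -v python)" core
```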
Keep us posted on your findings after testing on Linux, and if there are any new error messages or behaviors observed, please share them for further assistance.
Bug Description
When executing `HuggingFaceEmbedding(model_name='BAAI/bge-large-en-v1.5')`, the script exits abnormally.

Version
0.10.36
Steps to Reproduce
Simply execute `HuggingFaceEmbedding(model_name='BAAI/bge-large-en-v1.5')`.

Relevant Logs/Tracebacks