Closed: jpzhangvincent closed this issue 1 year ago.
I think Cannot allocate write+execute memory for ffi.callback() is caused by a build-time flag. Could you, for testing purposes, switch to the Python 3.9 distributed by brew?
This comment seems to suggest that the issue might be caused by not using the libffi that ships with macOS. Could you try rebuilding your Python executable and make sure it's built with that libffi? Then maybe install cffi from source to force it to use the system's libffi as well.
Can you give me some pointers on how to rebuild/check the Python executable with libffi? I'm using a conda environment.
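One quick, stdlib-only way to start checking is to ask the dynamic loader which libffi it resolves. This is a hedged sketch: it doesn't prove which libffi your interpreter was built against, but comparing the result across environments can reveal a mismatch of the kind discussed above.

```python
# Ask the loader which libffi it can see on this system. On macOS this
# typically resolves to the OS-shipped copy under /usr/lib; None means
# the loader found no libffi at all, which is itself a hint.
import ctypes.util

libffi_path = ctypes.util.find_library("ffi")
print("libffi found by the loader:", libffi_path)
```
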
@sfc-gh-mkeller while Homebrew Python 3.9 might resolve this, Snowflake's own Snowpark doesn't support Python 3.9 (and it doesn't look like it will any time soon), and Homebrew Python 3.8 doesn't resolve the issue.
This seems to be a big blocker for M1 Mac users. Any chance this fix could be prioritized?
I believe this is a direct consequence of the problem (with a suggested solution) described in #857
snowflake-connector-python will not run on M1 Macs (Silicon, Arm processor): See https://github.com/snowflakedb/snowflake-connector-python/issues/799 and dbt-labs/dbt#3722. That error is due to https://github.com/pyca/pyopenssl/issues/873. https://github.com/pyca/pyopenssl/issues/873#issuecomment-915146615 implies that a fix will not be forthcoming, and that it would be better to swap out the relevant pyopenssl functionality for Python's built-in SSL library.
@sfc-gh-mkeller What is the proposed solution here then? This is a huge blocker for all M1 Mac users.
EDIT: I am using conda-forge to install my environments. I can confirm that uninstalling the cffi package and reinstalling using pip seems to solve this issue.
conda uninstall --force cffi
pip install cffi
Does anyone know why that would fix this issue, and how it can be resolved so that package can be installed through conda? Using pip is not a solution for me.
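Since the effectiveness of the workaround depends on which cffi build is actually active, here is a small stdlib-only sketch to check what the current environment has installed, without importing cffi itself (importing a broken build isn't needed just to read its metadata):

```python
# Read the installed cffi version from package metadata (Python 3.8+).
import importlib.metadata as md

try:
    cffi_version = md.version("cffi")
except md.PackageNotFoundError:
    cffi_version = None

print("installed cffi version:", cffi_version)
```

Running this before and after the conda-uninstall/pip-reinstall dance should show whether the version (and hence the build) actually changed.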
Folks, we have documented a workaround for this issue for Python 3.8. Do try it out and let us know.
Thanks for the update, @sfc-gh-achandrasekaran!
Note that the first command in the guide, as it stands, leads to an error:
$ CONDA_SUBDIR=osx-64 conda create -n snowpark python=3.8 -c https://repo.anaconda.com/pkgs/snowflake numpy pandas
usage: conda [-h] [-V] command ...
conda: error: unrecognized arguments: numpy pandas
It should probably be like this:
CONDA_SUBDIR=osx-64 conda create -n snowpark python=3.8 numpy pandas -c https://repo.anaconda.com/pkgs/snowflake
Yep, that is it. We will update the docs. Thank you
FYI, I've got it working now. Thanks for providing a workaround.
Unfortunately, this workaround doesn't help when attempting to pair snowpark with tensorflow on an M1: tensorflow (installed via miniforge) requires osx-arm64. For the moment I'm using two environments, which isn't ideal, but I thought I'd point out this edge case.
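Since the workaround mixes osx-64 and osx-arm64 environments, it can help to confirm which architecture a given interpreter is actually running as. An osx-64 conda env on Apple Silicon reports x86_64 because it runs under Rosetta 2:

```python
# Report the architecture of the *running* interpreter. Under an
# osx-64 conda env on an M1/M2 Mac this prints x86_64 (Rosetta 2);
# a native osx-arm64 env prints arm64.
import platform

machine = platform.machine()
print("interpreter architecture:", machine)
```
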
I got around this problem by updating my Python version to 3.10 (I created a new conda environment with 3.10) and installing snowflake-connector-python in it.
@anthonnyO Can we install snowflake-connector-python in 3.10? I tried to use pip but got the following error.
$ pip install snowflake-snowpark-python
ERROR: Ignored the following versions that require a different python version: 0.10.0 Requires-Python ==3.8.*; 0.11.0 Requires-Python ==3.8.*; 0.12.0 Requires-Python ==3.8.*; 0.6.0 Requires-Python ==3.8.*; 0.7.0 Requires-Python ==3.8.*; 0.8.0 Requires-Python ==3.8.*; 0.9.0 Requires-Python ==3.8.*; 1.0.0 Requires-Python ==3.8.*
ERROR: Could not find a version that satisfies the requirement snowflake-snowpark-python (from versions: none)
ERROR: No matching distribution found for snowflake-snowpark-python
@ironerumi Anthony was talking about snowflake-connector-python, while you are talking about snowflake-snowpark-python.
The root cause of this issue has been found and fixed by https://github.com/conda-forge/cffi-feedstock/pull/47/files
The cffi package with build number 3 from both conda-forge and Anaconda should have the issue fixed. A description of the issue is at: https://foss.heptapod.net/pypy/cffi/-/blob/branch/default/c/_cffi_backend.c#L64-89
Can we close this issue?
@iamontheinet we want to make sure customers are no longer running into this before closing it.
Thank you @sfc-gh-mkeller! I was able to fix this by installing cffi >= 1.15.1=*_3 in my Conda environment:
channels:
- 'conda-forge'
- 'defaults'
dependencies:
- 'python'
- 'cffi >= 1.15.1=*_3'
- 'snowflake-connector-python'
# etc.
Note that Conda Forge has builds for Python 3.8 - 3.10, but apparently not yet for 3.11, for this particular build revision.
@jordantshaw It works on miniforge too, thanks for the help.
https://github.com/conda-forge/cffi-feedstock/pull/47 was just merged and cffi=1.15.1=*_3 is now also available from conda-forge. This fixes the issue for me :partying_face:
Thank you!
Hello,
I'm unable to fix this error on M1 Mac.
conda create -y -q --no-default-packages --name conda_env python=3.8 virtualenv
conda activate conda_env
python -m venv --system-site-packages venv
source venv/bin/activate
python -m pip install -U -q dbt-snowflake
python -m pip install --force-reinstall "cffi>=1.15.1"
Even if I do the installation of cffi during the conda env creation:
conda create -y -q --no-default-packages --name conda_env python=3.8 virtualenv cffi=1.15.1=*_3
Thanks @jpzhangvincent for your issue.
@4sushi I had no issues on my M1 with the YAML I posted above. Try cffi>=1.15.1=*_3 instead of cffi=1.15.1=*_3, and try quoting the * to prevent the shell from interpreting it.
conda create -p test-env python 'cffi>=1.15.1=*_3' snowflake-connector-python
Creating a virtualenv inside a Conda env also seems redundant and unnecessary.
Edit: Does CFFI 1.15.1 v3 support Python 3.8? I am using 3.10 in my project, but I know e.g. Snowpark still requires 3.8.
EDIT: I am using conda-forge to install my environments. I can confirm that uninstalling the cffi package and reinstalling using pip seems to solve this issue.
conda uninstall --force cffi
pip install cffi
Using python 3.10.4 this solution worked for me
This has since been fixed.
Hi, can anybody please help me with this? I am trying to run
client = pymongo.MongoClient("mongodb+srv://Aamir:
and I'm getting the error: MemoryError: Cannot allocate write+execute memory for ffi.callback(). You might be running on a system that prevents this. For more information, see https://cffi.readthedocs.io/en/latest/using.html#callbacks
I'm using a Macbook M1 with a conda environment for Python 3.10.4.
And running conda install "cffi >= 1.15.1=*_3" fixed this issue for me :)
PD: @aliamirr try this
I am using a MacBook M2 with Python 3.8. Running conda install "cffi >= 1.15.1=*_3" also works for me.
This issue and two similar ones (https://github.com/snowflakedb/snowflake-connector-python/issues/986 and https://github.com/snowflakedb/snowflake-connector-python/issues/799) are closed (and some of those threads describe the issue as "fixed"), but it seems that the error still occurs, and at least some users still need the extra step of installing a particular cffi version to work around it.
FWIW I still hit this issue with the latest connector version. Below I create a new conda environment on my M1 Mac and try using the latest snowflake connector for python 3.8 (I end up with the latest version, 3.7.0).
conda create -n snowflake-connector-python-m1 python=3.8
conda activate snowflake-connector-python-m1
pip install snowflake-connector-python
pip install ipython
ipython
import snowflake.connector
snowflake.connector.connect()
I get a stack trace ending with MemoryError: Cannot allocate write+execute memory for ffi.callback(). You might be running on a system that prevents this. For more information, see https://cffi.readthedocs.io/en/latest/using.html#callbacks
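To separate the connector from the underlying cffi problem, here is a minimal probe of ffi.callback() itself, as a hedged sketch: it assumes cffi is importable, and on an affected build it should hit the same MemoryError without involving Snowflake at all.

```python
# Probe whether the installed cffi can allocate a callback trampoline,
# which is the operation that fails with the write+execute MemoryError.
try:
    import cffi

    ffi = cffi.FFI()

    @ffi.callback("int(int)")
    def twice(n):
        return 2 * n

    # cffi callback cdata objects are callable from Python too.
    status = "ok" if twice(21) == 42 else "unexpected result"
except ImportError:
    status = "cffi not installed"
except MemoryError:
    status = "write+execute allocation blocked (the bug)"

print("ffi.callback probe:", status)
```

If this prints the "blocked" status, the connector is not at fault and swapping in a fixed cffi build (e.g. the conda build number 3 mentioned above) is the thing to try.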
System:
Please answer these questions before submitting your issue. Thanks!
What version of Python are you using?
Python 3.9.13 | packaged by conda-forge
What operating system and processor architecture are you using?
macOS-10.16-arm64-arm-64bit
What are the component versions in the environment (pip freeze)?
cffi==1.15.1
cryptography==37.0.4
snowflake-sqlalchemy==1.3.4
snowflake-connector-python==2.7.9
requests==2.28.1
SQLAlchemy==1.4.39
What did you do? I installed the package with pip install snowflake-connector-python and then simply tried to run a query with the created Snowflake connector engine, but I got the error.
Please help! I have spent a lot of time debugging and tried different workarounds, for example the suggestion from this post.
I suspect it's related to cffi, since I still get the error even after running pip install cffi. Thanks a lot!