As the design specification makes clear, a function's source directory is expected to include main.py, requirements.txt, and __init__.py. Additional local dependencies (i.e., code) may be included so long as the imported code lives within the source directory, as described here. This precludes importing from sibling or parent directories.
In a directory setup like the one sketched below, main.py can import internal.code_internal and, if base/ has been added to PYTHONPATH, can also import base.code_sibling. Cloud Functions, by design, does not allow this latter import, since only the contents of functions/f/ are deployed to its servers. My question concerns workarounds, in particular the use of symbolic links.
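For concreteness, here is a minimal sketch of what functions/f/main.py might look like under that layout. The directory tree in the comment, the entry-point name, and the return value are assumptions for illustration, reconstructed from the import paths mentioned above rather than taken from an actual project:

```python
# Assumed layout (reconstructed from the imports discussed; names are illustrative):
#
#   base/
#       code_sibling.py
#   functions/
#       f/
#           __init__.py
#           main.py
#           requirements.txt
#           internal/
#               __init__.py
#               code_internal.py

import internal.code_internal   # fine: internal/ lives inside functions/f/
# import base.code_sibling      # works locally with base/ on PYTHONPATH, but fails
#                               # once deployed, because base/ is a sibling of
#                               # functions/f/ and is never uploaded

def entry_point(request):
    # Hypothetical HTTP entry point; the real handler name and logic would differ.
    return "OK"
```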
In conventional Python, a symbolic link can be added to functions/f/ that points to base/, making the contents of base/ importable as though they were directly in the functions/f/ directory: in other words, as though the file functions/f/base/code_sibling.py existed. However, this does not change the deployment behavior of Cloud Functions: the symbolic link is seemingly ignored by gcloud functions deploy. Instead, I find myself copying the base/ directory into functions/f/, deploying the Cloud Function, and then deleting the copied functions/f/base/ directory.
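For reference, this is roughly the copy-then-deploy-then-clean-up sequence I am doing by hand, sketched as a small Python script. The function name, runtime, and trigger flag are placeholders rather than values from my actual deployment:

```python
# Rough automation of the manual workaround: copy base/ into the source
# directory, deploy, then remove the copy. "my-function", the runtime, and
# the trigger flag are assumptions for illustration.
import shutil
import subprocess

SRC = "base"
DST = "functions/f/base"

shutil.copytree(SRC, DST)          # temporarily vendor the sibling package
try:
    subprocess.run(
        [
            "gcloud", "functions", "deploy", "my-function",
            "--runtime", "python39",
            "--trigger-http",
            "--source", "functions/f",
        ],
        check=True,
    )
finally:
    shutil.rmtree(DST)             # clean up the copy even if the deploy fails
```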
Is support for symbolic links planned, or are there other workarounds that better address this situation?
Thank you.