Closed — dnnane closed this issue 2 weeks ago
Hi @dnnane! Thanks for raising an issue on the AlloyDB Python Connector 😄
What you are experiencing is most likely the equivalent of https://github.com/GoogleCloudPlatform/cloud-sql-python-connector/issues/830#issuecomment-1699141112
The Python Connector schedules background tasks to call AlloyDB APIs to get a valid certificate for the SSL connection as well as to retrieve metadata such as the IP address to connect to.
If you initialize the `Connector` as a regular global variable, it will time out because Cloud Functions only allocates CPU at the time of the first request, during the request context (i.e., when the `action` function is called). This is why Cloud Functions recommends lazily initializing global variables.
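Stripped of the AlloyDB specifics, the lazy-initialization pattern looks like this (stdlib-only sketch; `ExpensiveClient` is a stand-in for constructing the `Connector`):

```python
class ExpensiveClient:
    """Stand-in for an object that is costly to construct (e.g. the Connector)."""
    instances = 0

    def __init__(self):
        ExpensiveClient.instances += 1


client = None  # NOT constructed at import time, when no CPU is allocated


def handler(request):
    global client
    if client is None:            # the first request pays the init cost
        client = ExpensiveClient()
    return client                 # later requests reuse the same instance


a = handler("req-1")
b = handler("req-2")
assert a is b                         # same object reused across requests
assert ExpensiveClient.instances == 1  # constructed exactly once
```

Because construction happens inside the request handler, it runs while the function instance actually has CPU, instead of during module import.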
You could try something along the lines of:
```python
from google.cloud.alloydb.connector import Connector

# lazily initialized Connector
connector = None

# function to return a database connection
def getconn():
    conn = connector.connect(
        "projects/<YOUR_PROJECT>/locations/<YOUR_REGION>/clusters/<YOUR_CLUSTER>/instances/<YOUR_INSTANCE>",
        "pg8000",
        user="postgres",
        password="postgres",
        db="postgres",
    )
    return conn

def action(request):
    global connector  # needed so the assignment updates the module-level variable
    if connector is None:
        connector = Connector()
    print(getconn())
```
Even using the `Connector` as a lazy global variable may lead to issues if Cloud Functions scales down to zero, as the Python Connector's background refreshes may again be throttled. This is currently a known issue, and I created #298 to track adding a lazy refresh option so you can configure the `Connector` not to schedule background refreshes.
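To make the trade-off concrete, here is a toy, stdlib-only sketch of the "lazy refresh" idea (all names here are illustrative, not the Connector's actual API): instead of a background task refreshing a certificate on a schedule, the value is refreshed inline on access once it goes stale, so no CPU is needed between requests.

```python
import threading
import time


class LazyRefreshCache:
    """Toy model of lazy refresh: no background thread; the cached
    value is re-fetched on demand when it has expired."""

    def __init__(self, fetch, ttl_seconds):
        self._fetch = fetch          # callable producing a fresh value
        self._ttl = ttl_seconds
        self._value = None
        self._expires_at = 0.0       # forces a fetch on first access
        self._lock = threading.Lock()

    def get(self):
        with self._lock:
            now = time.monotonic()
            if now >= self._expires_at:      # stale or never fetched
                self._value = self._fetch()  # refresh inline, on demand
                self._expires_at = now + self._ttl
            return self._value


calls = 0

def fetch_cert():
    """Stand-in for an API call that mints a short-lived certificate."""
    global calls
    calls += 1
    return f"cert-{calls}"


cache = LazyRefreshCache(fetch_cert, ttl_seconds=60)
print(cache.get())  # first access triggers a fetch: cert-1
print(cache.get())  # still fresh, no second fetch: cert-1
```

The cost is that the unlucky request arriving after expiry pays the refresh latency, which is exactly the trade-off a scale-to-zero environment forces.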
Since you are using a Private IP connection, you may prefer to skip the Python Connector and connect directly to your instance via its Private IP address until these limitations are fixed.
An example using psycopg2:
```python
import psycopg2

def action(request):
    # see https://www.psycopg.org/docs/usage.html for usage
    conn = psycopg2.connect(
        dbname="test",
        user="postgres",
        password="secret",
        host="<YOUR-ALLOYDB-PRIVATE-IP>",
        port=5432,
    )
    # open a cursor and perform a query
    cur = conn.cursor()
    cur.execute("SELECT NOW()")
    print(cur.fetchone())
    # clean up
    cur.close()
    conn.close()
```
Hi,
Thanks for your quick response. I thought I had responded before, but I hadn't; sorry.
It seems to be a problem with the behavior of GCP's serverless services rather than with the package itself; in my case, for example, using Serverless VPC Access solved the problem.
Sorry for the inconvenience. If you have no further insights (I actually discovered this throttling problem from your comment), I will close the issue.
Thanks @dnnane. Let's close this.
Bug Description
I cannot get this connector working in a Cloud Function; I get a `time out` error. I am following the steps outlined in this codelab, using a connection through Serverless VPC Access to an AlloyDB instance in a cluster.
I tried the plain codelab, and it works.
When using your connector in a Cloud Function, I am unable to connect (`time out`). My code in the cloud function is as follows:
Do you have more insights? I think the error messages should be clearer; `time out` on its own means almost nothing.
Example code (or command)
Steps to reproduce?
Environment
Additional Details
No response