GoogleCloudPlatform / cloud-sql-nodejs-connector

A JavaScript library for connecting securely to your Cloud SQL instances
Apache License 2.0

Update README example for passing custom credentials #272

Closed TerenceSweeneyP closed 10 months ago

TerenceSweeneyP commented 10 months ago

I am having a problem connecting my Vercel-hosted back end to my Cloud SQL instance when I use a GoogleAuth instance as credentials with cloud-sql-connector.

I posted on Stack Overflow, but Jack Wotherspoon suggested I post here instead. Here is the question on SO for background.

When I set the GOOGLE_APPLICATION_CREDENTIALS env variable to the path where the service credentials JSON is stored, and use:

const connector = new Connector({
});

I can connect.

However, I need to load the service credentials JSON directly from an environment variable (not from a path).

When I try this:

const connector = new Connector({
  auth: new GoogleAuth.fromJSON(keys),
});

I can't make a connection. Am I missing something?

Here is my full DB.js file:

import mysql from 'mysql2/promise';
import {Connector} from '@google-cloud/cloud-sql-connector';
import {GoogleAuth} from 'google-auth-library';

let pool;

// load the environment variable with our keys
const keysEnvVar = process.env.CREDENTIALS_JSON
if (!keysEnvVar) {
  throw new Error('The CREDENTIALS_JSON environment variable was not found!');
}
const keys = JSON.parse(keysEnvVar);
console.log("keys=",keys)

const connector = new Connector({
  auth: new GoogleAuth.fromJSON(keys),
});

const clientOpts = await connector.getOptions({
  instanceConnectionName: 'survey-engine-301112:australia-southeast2:database-australia',
  ipType: 'PUBLIC',
  scope: ['https://www.googleapis.com/auth/cloud-platform'],
});

if (!pool) {
  pool = await mysql.createPool({
    ...clientOpts,
    user: process.env.DBUSER,
    password: process.env.DBPASSWORD,
  });
}

const conn = await pool.getConnection();
const [result] = await conn.query( `SELECT NOW();`);
console.table(result); // prints returned time value from server
conn.release();

export default pool

Here is my output from console.log("keys=",keys). I have hashed out some of the details.

keys= {
  type: 'service_account',
  project_id: 'survey-engine-301112',
  private_key_id: 'd65599b32daaa8d9d59f6e6221a16e62d6c2ae9f',
  private_key: '-----BEGIN PRIVATE KEY-----\n' +
    'MIIEvwIBADANBgkqhkiG9w0BAQEFAASCBKkwggSlAgEAAoIBAQDkYpkZMXRJQfp+\n' +
    '#########################\n'+    
    'cnTc7caSvsgRFpeLCMxV5REeQw==\n' +
    '-----END PRIVATE KEY-----\n',
  client_email: 'backend-to-db@survey-engine-301112.iam.gserviceaccount.com',
  client_id: '10#################22',
  auth_uri: 'https://accounts.google.com/o/oauth2/auth',
  token_uri: 'https://oauth2.googleapis.com/token',
  auth_provider_x509_cert_url: 'https://www.googleapis.com/oauth2/v1/certs',
  client_x509_cert_url: 'https://www.googleapis.com/robot/v1/metadata/x509/backend-to-db%40survey-engine-301112.iam.gserviceaccount.com',
  universe_domain: 'googleapis.com'
}

I am using cloud-sql-connector version 1.2.0, google-auth-library version 9.2.0, mysql2 version 3.6.2, and Node version 20.10.0.

jackwotherspoon commented 10 months ago

Hi @TerenceSweeneyP Thanks for opening this issue! πŸ˜„

I will update our README example to give you a better demonstration of passing custom credentials to the Node Connector. As you have pointed out, the sample in our README doesn't convey the intended use case very well. We should have an example that actually specifies a service account differing from the Application Default Credentials.

jackwotherspoon commented 10 months ago

Also @TerenceSweeneyP, do you mind sharing the actual error you are getting? 😄 It will help us reproduce and debug.

TerenceSweeneyP commented 10 months ago

Hi @jackwotherspoon. Updating the README would probably be helpful. I am not sure whether my usage is correct.

Currently I am trying:

const auth = new GoogleAuth({
  credentials: keys,
  scope: ['https://www.googleapis.com/auth/cloud-platform'],
});
const connector = new Connector({
  auth: auth,
});

This doesn't throw an error, but I only get a connection when GOOGLE_APPLICATION_CREDENTIALS is set in my .env, so it isn't using the credentials from the GoogleAuth instance.

edosrecki commented 10 months ago

@TerenceSweeneyP

I believe the issue is the following. You are passing a GoogleAuth instance as loginAuth, so the first branch executes, which uses the default GoogleAuth (and thus tries to load your GOOGLE_APPLICATION_CREDENTIALS). See the code here:

https://github.com/GoogleCloudPlatform/cloud-sql-nodejs-connector/blob/c84e20aae0053404ab0e1778ea858e027d545e8b/src/sqladmin-fetcher.ts#L83-L91

To fix the issue, I believe you should pass an AuthClient instance as loginAuth, for example:

const auth = new GoogleAuth({
  credentials: keys,
  scope: ['https://www.googleapis.com/auth/cloud-platform'],
});

const authClient = await auth.getClient();

const connector = new Connector({
  auth: authClient,
});

However, I haven't tried this out; I am writing purely from reading the code.

ruyadorno commented 10 months ago

@edosrecki I believe you're right. It looks like we need to update the README docs to show AuthClient usage first, since that is most likely to set the right scopes for users and avoid this type of confusion.

@danielbankhead what do you think? Was there any other reason for documenting setting GoogleAuth instead that I may be missing here?

TerenceSweeneyP commented 10 months ago

@edosrecki I have tried your code snippet and I get the following error:

- error Error: error:1E08010C:DECODER routines::unsupported
    at Sign.sign (node:internal/crypto/sig:131:29)
    at Object.sign (/Users/terencesweeney/Coding/Back End/nextbe/node_modules/jwa/index.js:152:45)
    at Object.jwsSign [as sign] (/Users/terencesweeney/Coding/Back End/nextbe/node_modules/jws/lib/sign-stream.js:32:24)
    at JWTAccess.getRequestHeaders (/Users/terencesweeney/Coding/Back End/nextbe/node_modules/google-auth-library/build/src/auth/jwtaccess.js:123:31)
    at JWT.getRequestMetadataAsync (/Users/terencesweeney/Coding/Back End/nextbe/node_modules/google-auth-library/build/src/auth/jwtclient.js:79:51)
    at JWT.requestAsync (/Users/terencesweeney/Coding/Back End/nextbe/node_modules/google-auth-library/build/src/auth/oauth2client.js:368:34)
    at JWT.request (/Users/terencesweeney/Coding/Back End/nextbe/node_modules/google-auth-library/build/src/auth/oauth2client.js:362:25)
    at GoogleAuth.request (/Users/terencesweeney/Coding/Back End/nextbe/node_modules/google-auth-library/build/src/auth/googleauth.js:697:23) {
  digest: undefined

I get where you are coming from though. When I look at the code from cloud-sql-nodejs-connector/src/sqladmin-fetcher.ts it seems I have to pass in an AuthClient instance (if I want to provide actual credentials).

I have played around and tried numerous options from the docs including:

const authClient = new auth.fromJSON(keys);
const connector = new Connector({
  auth: authClient,
});

I can't get anything to work, though.

edosrecki commented 10 months ago

@TerenceSweeneyP

For me this works:

import fs from 'fs'
import { Connector } from '@google-cloud/cloud-sql-connector'
import { auth } from 'google-auth-library'

console.log('This should be undefined:', process.env.GOOGLE_APPLICATION_CREDENTIALS)

const authClient = auth.fromJSON(JSON.parse(
  fs.readFileSync(PATH_TO_CREDENTIALS, 'utf-8')
))
const connector = new Connector({ auth: authClient })

TerenceSweeneyP commented 10 months ago

@edosrecki Thanks. I need to load the actual credentials themselves from environment variables though, not from the path. That is why I am going down this route.

I tried:

const authClient = auth.fromJSON(keys); // keys has already been parsed
const connector = new Connector({ auth: authClient });

But it gives: - error Error: error:1E08010C:DECODER routines::unsupported

I went back and tried loading from the path with your code (as a test), but it also wouldn't connect. It doesn't give an error; it just doesn't connect. This is strange, because when I set the same path as GOOGLE_APPLICATION_CREDENTIALS and just use const connector = new Connector(), it works fine.

TerenceSweeneyP commented 10 months ago

Okay, as a test, I have noticed that when I set GOOGLE_APPLICATION_CREDENTIALS and just use const connector = new Connector(), it connects fine. However, when I add the other code above it, like this:

const authClient = auth.fromJSON(JSON.parse(
  fs.readFileSync(PATH_TO_CREDENTIALS, 'utf-8')
));
const connector = new Connector();

It no longer connects, which seems strange. I am going to try to isolate which piece of the code is preventing the connection.

edosrecki commented 10 months ago

@TerenceSweeneyP

But it gives: - error Error: error:1E08010C:DECODER routines::unsupported

This has to be an issue with the private_key field format in your keys variable. Can you try Base64-encoding your JSON key and storing it as such in your environment variable? Then Base64-decode it before passing it to auth.fromJSON().
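For reference, the usual cause of that DECODER error is a private_key whose real newlines arrived in the environment as the two-character sequence \n. A minimal, standalone sketch of the check and repair (the key string below is a dummy placeholder, not a real credential):

```javascript
// Dummy PEM-ish value; a real private_key is much longer.
const mangled = '-----BEGIN PRIVATE KEY-----\\nMIIEvw...\\n-----END PRIVATE KEY-----\\n';

// A mangled key contains no actual newline characters:
console.log(mangled.includes('\n')); // prints: false

// Turning the literal "\n" sequences back into real newlines repairs it:
const repaired = mangled.replace(/\\n/g, '\n');
console.log(repaired.includes('\n')); // prints: true
```

Base64-encoding the whole key, as suggested above, sidesteps this problem entirely because the encoded value contains no newlines to mangle.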

jackwotherspoon commented 10 months ago

@TerenceSweeneyP I can confirm that I just got the following working, which I think is what you are looking for. It builds on what @edosrecki mentioned above 😄, just adding the scopes:

import mysql from 'mysql2/promise';
import {Connector} from '@google-cloud/cloud-sql-connector';
import {auth} from 'google-auth-library';

// load the environment variable with our keys
const keysEnvVar = process.env.CREDENTIALS_JSON
if (!keysEnvVar) {
  throw new Error('The CREDENTIALS_JSON environment variable was not found!');
}
const keys = JSON.parse(keysEnvVar);
console.log("keys=",keys);

const authClient = auth.fromJSON(keys);
authClient.scopes = ['https://www.googleapis.com/auth/sqlservice.admin'];

const connector = new Connector({
    auth: authClient,
});

const clientOpts = await connector.getOptions({
    instanceConnectionName: 'my-project:my-region:my-instance',
    ipType: 'PUBLIC',
});

const pool = await mysql.createPool({
    ...clientOpts,
    user: process.env.DBUSER,
    password: process.env.DBPASSWORD,
    database: '',
});

const conn = await pool.getConnection();
const [result] = await conn.query( `SELECT NOW();`);
console.table(result); // prints returned time value from server

await pool.end()
connector.close()

This is also the same as the following, so you can choose which you prefer:

import mysql from 'mysql2/promise';
import {Connector} from '@google-cloud/cloud-sql-connector';
import {GoogleAuth} from 'google-auth-library';

// load the environment variable with our keys
const keysEnvVar = process.env.CREDENTIALS_JSON
if (!keysEnvVar) {
  throw new Error('The CREDENTIALS_JSON environment variable was not found!');
}
const keys = JSON.parse(keysEnvVar);
console.log("keys=",keys)

const auth = new GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/sqlservice.admin']
});

const connector = new Connector({
    auth: auth.fromJSON(keys),
});

const clientOpts = await connector.getOptions({
    instanceConnectionName: 'my-project:my-region:my-instance',
    ipType: 'PUBLIC',
});

const pool = await mysql.createPool({
    ...clientOpts,
    user: process.env.DBUSER,
    password: process.env.DBPASSWORD,
    database: '',
});

const conn = await pool.getConnection();
const [result] = await conn.query( `SELECT NOW();`);
console.table(result); // prints returned time value from server

await pool.end()
connector.close()

edosrecki commented 10 months ago

just adding the scopes:

I would like to point out that adding scopes does not seem to be necessary since they are added in SQLAdminFetcher.

As an additional note, I would like to share my opinion that the Connector's constructor API is extremely confusing, given that auth can be an instance of two different types (GoogleAuth, AuthClient), and that depending on the provided type, the connector chooses between two very different execution paths.

For example: the property is called auth in the Connector class (loginAuth in SQLAdminFetcher), but it is only used for loginAuth when its type is GoogleAuth. On the other hand, it is used for both adminAuth and loginAuth if its type is AuthClient.
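That dual-path behavior can be illustrated with a simplified, self-contained sketch. The class names mirror the real google-auth-library types, but nothing here is the connector's actual source; it just models the dispatch described above:

```javascript
// Stand-ins for the google-auth-library types; illustrative only.
class GoogleAuth {}
class AuthClient {}

// Simplified model of the dispatch: the same `auth` option drives two
// very different paths depending on its runtime type.
function resolveAuth(auth) {
  if (auth instanceof GoogleAuth) {
    // GoogleAuth branch: only loginAuth uses the provided instance; the
    // admin client still falls back to Application Default Credentials.
    return {loginAuth: auth, adminAuth: 'ADC'};
  }
  if (auth instanceof AuthClient) {
    // AuthClient branch: the supplied client is used for both roles.
    return {loginAuth: auth, adminAuth: auth};
  }
  // No auth provided: everything falls back to ADC.
  return {loginAuth: 'ADC', adminAuth: 'ADC'};
}

const client = new AuthClient();
console.log(resolveAuth(client).adminAuth === client);  // prints: true
console.log(resolveAuth(new GoogleAuth()).adminAuth);   // prints: ADC
```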

jackwotherspoon commented 10 months ago

I would like to point out that adding scopes does not seem to be necessary since they are added in SQLAdminFetcher.

Interesting. If I remove the scopes from my sample I get an unauthorized error... @ruyadorno, does this sound like a bug to you? If I am passing in a custom AuthClient, should I be granting it the proper scopes in my code, or should the Connector be applying the scopes?

edosrecki commented 10 months ago

I would like to point out that adding scopes does not seem to be necessary since they are added in SQLAdminFetcher.

Interesting, if I remove the scopes from my sample I get an unauthorized error... @ruyadorno does this sound like a bug to you? If I am passing in a custom AuthClient should i be granting it the proper scopes in my code or should the Connector be applying the scopes?

My bad: after checking the google-auth-library code, scopes passed to the GoogleAuth constructor are only applied if an authClient is NOT passed to it. So yes, they need to be set. I stand corrected 👍

danielbankhead commented 10 months ago

@danielbankhead what do you think? Was there any other reason for documenting setting GoogleAuth instead that I may be missing here?

I think this PR missed a commit to support the supplied GoogleAuth for loginAuth, as seen here:

https://github.com/GoogleCloudPlatform/cloud-sql-nodejs-connector/blob/c84e20aae0053404ab0e1778ea858e027d545e8b/src/sqladmin-fetcher.ts#L83-L87

I'll whip up a small PR to resolve.

jackwotherspoon commented 10 months ago

@danielbankhead is there a design reason we want to support both types GoogleAuth and AuthClient instead of just supporting one of the types (GoogleAuth)?

danielbankhead commented 10 months ago

@danielbankhead is there a design reason we want to support both types GoogleAuth and AuthClient instead of just supporting one of the types?

Yep, customers can use GoogleAuth if they have a shared/general configuration they would like to use throughout their application and don't have a preference for which auth client to use. AuthClient is useful for customers that would like to fine-tune their credentials configuration, or even pass in a custom auth client, to satisfy their unique needs.

TerenceSweeneyP commented 10 months ago

Hi Guys

I finally got this working.

@edosrecki In regards to the "error:1E08010C:DECODER routines::unsupported" error, you are right. It is an error with the private-key format (can't believe I didn't Google this earlier).

@jackwotherspoon Yes. Both of your examples work.

In retrospect, not passing the credentials in the correct format (as a string) was causing me the biggest problem. It is obvious now (given that the code includes JSON.parse()), but it took me a lot of trial and error to get this in my bones.

Thanks a lot for your help.

TerenceSweeneyP commented 10 months ago

As an aside, I don't think the credentials in the example here will work with the code snippet given there. JSON.parse() will not work with the contents of this environment variable. Before adding the credentials to an environment variable, the credentials first need to be either:

  1. Turned into a string or
  2. Turned into a string then base 64 encoded (as per some recommendations)

I think this example needs to be changed. I am open to being corrected if I have missed something though.

jackwotherspoon commented 10 months ago

As an aside, I don't think the credentials in the example here will work with the code snippet given there. JSON.parse() will not work with the contents of this environment variable. Before adding the credentials to an environment variable, the credentials first need to be either:

  1. Turned into a string or
  2. Turned into a string then base 64 encoded (as per some recommendations)

I think this example needs to be changed. I am open to being corrected if I have missed something though.

@TerenceSweeneyP the credentials in the example you linked actually are a string 😉 They sneakily have single quotes around the JSON when they run $ export CREDS='{...}', so I think it does work. I used that example and it worked for me.

TerenceSweeneyP commented 10 months ago

@jackwotherspoon It may work for you, but I think the example doesn't work for the scenario given. The example specifies the use case as working with "systems that deploy directly from source control (Heroku, App Engine, etc)".

The example uses $ export CREDS, which I believe is how you would create an environment variable in a Bash shell or start-up script. I may be wrong, but I don't think this is possible in the environments given. When I load my environment variables on my local development machine, it is from a .env file, and when I deploy to production (on Vercel) I load them in the dashboard.

I am not sure how the example works in your environment; however, the example implies that if you take the JSON from the service key file and wrap it in single quotes, it becomes a string you can apply JSON.parse to. However, this doesn't work (you can try it here). This was a source of confusion for me when working through the docs, and I think it may also confuse others who expect the example to work for the use case given.

edosrecki commented 10 months ago

however, the example implies that if you take the JSON from the service key file and wrap it in single quotes, it becomes a string you can apply JSON.parse to.

If I may join the conversation... I don't think that example suggests that. It just shows one way (in the shell) to set the environment variable containing the service account JSON credentials. This is typical throughout documentation; it doesn't try to cover every possible environment in which users might deploy their code (that would be rather complicated).

A typical issue when loading JSON credentials from the environment is the formatting of the private_key, which contains newlines. These can wrongly get copied as a literal \n string, which in turn causes parsing of the key to fail.

That's why the best practice is to Base64-encode the key and store it like that in your environment variable:

cat service_account_key.json | base64 -w0
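The matching decode step at runtime is a one-liner in Node. A standalone sketch (CREDENTIALS_B64 is an assumed variable name, and the sketch populates it with a dummy key so it runs on its own; in a real deployment your platform's dashboard sets it):

```javascript
// For this standalone sketch, populate the (assumed) env var with a dummy
// Base64-encoded key; a real key would come from the shell command above.
process.env.CREDENTIALS_B64 = Buffer.from(
  JSON.stringify({type: 'service_account', project_id: 'my-project'})
).toString('base64');

// Runtime decode: Base64 -> JSON string -> object, ready for auth.fromJSON().
const keys = JSON.parse(
  Buffer.from(process.env.CREDENTIALS_B64, 'base64').toString('utf-8')
);
console.log(keys.project_id); // prints: my-project
```

Because the Base64 string is a single line with no quotes or newlines, it survives .env files and dashboard text fields without any escaping issues.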

TerenceSweeneyP commented 10 months ago

@edosrecki Okay. I get where you are coming from.

I have read over my last comment and realized it probably reads as brusque, and not relevant. I apologize for that.

It took some time for me to work out how to get the credentials into the right format, and I found the way credentials were shown in the example confusing. It would have been clearer for me if the environment variable was not shown in the example, and there was instead a paragraph explaining that the credentials need to be stored as a string (and possibly Base64-encoded). Then I could have researched how to do that in my environment.

This may be just me; others may find it clear as-is. If I can save someone else some work, though, it would be worthwhile.

I am happy to leave the conversation and get back to focusing on my project.

Thanks for your support. Are you fine with my closing the question as completed?